Riemannian Optimization for High-Dimensional Tensor Completion
Author
Abstract
Tensor completion aims to reconstruct a high-dimensional data set with a large fraction of missing entries. The assumption of low-rank structure in the underlying original data allows us to cast the completion problem into an optimization problem restricted to the manifold of fixed-rank tensors. Elements of this smooth embedded submanifold can be efficiently represented in the tensor train (TT) or matrix product states (MPS) format, with storage complexity scaling linearly with the number of dimensions. We present a nonlinear conjugate gradient scheme within the framework of Riemannian optimization which exploits this favorable scaling. Numerical experiments and comparisons to existing methods show the effectiveness of our approach for the approximation of multivariate functions. Finally, we show that our algorithm can obtain competitive reconstructions from uniform random sampling of few entries compared to adaptive sampling techniques such as cross-approximation.
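To fix notation (the symbols below are chosen for illustration and need not match the paper's own notation), the completion problem over the fixed-rank manifold can be stated as

\min_{X \in \mathcal{M}_r} \; f(X) := \tfrac{1}{2}\,\| P_\Omega(X) - P_\Omega(A) \|_F^2,
\qquad
\mathcal{M}_r := \{ X \in \mathbb{R}^{n_1 \times \cdots \times n_d} : \operatorname{rank}_{\mathrm{TT}}(X) = (r_1, \ldots, r_{d-1}) \},

where A is the partially observed tensor, \Omega the index set of observed entries, and P_\Omega the operator that keeps entries in \Omega and sets all others to zero. Storing an element of \mathcal{M}_r in the TT format as d third-order cores requires O(d n r^2) memory instead of O(n^d), which is the linear scaling in the number of dimensions referred to above.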
Related works
Riemannian Tensor Completion with Side Information
Riemannian optimization methods have been shown to be both fast and accurate in recovering a large-scale tensor from its incomplete observation. However, almost all recent Riemannian tensor completion methods consider only the low-rank constraint. Another important source of information, side information or features, remains largely unexploited within the Riemannian optimization framework. In this paper, we exp...
Low-rank tensor completion: a Riemannian manifold preconditioning approach
We propose a novel Riemannian manifold preconditioning approach for the tensor completion problem with rank constraint. A novel Riemannian metric or inner product is proposed that exploits the least-squares structure of the cost function and takes into account the structured symmetry that exists in the Tucker decomposition. The specific metric allows the use of the versatile framework of Riemannian opt...
Low-Rank Tensor Completion by Riemannian Optimization
In tensor completion, the goal is to fill in missing entries of a partially known tensor under a low-rank constraint. We propose a new algorithm that applies Riemannian optimization techniques on the manifold of tensors of fixed multilinear rank. More specifically, a variant of the nonlinear conjugate gradient method is developed. Paying particular attention to the efficient implementation, ou...
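As a rough illustration of the kind of iteration involved (not the authors' actual implementation), a generic Riemannian nonlinear conjugate gradient loop can be sketched as follows. Every name here (riemannian_cg, riemannian_grad, retract, transport, line_search) is a hypothetical placeholder for the manifold-specific operations and is not the API of any particular library.

import numpy as np

def riemannian_cg(x0, riemannian_grad, retract, transport, line_search,
                  max_iter=100, tol=1e-8):
    """Generic Riemannian nonlinear CG loop (Fletcher-Reeves variant).

    The manifold-specific operations -- riemannian_grad, retract, transport
    and line_search -- are assumed to be supplied by the caller; tangent
    vectors are represented here simply as NumPy arrays.
    """
    x = x0
    grad = riemannian_grad(x)        # Euclidean gradient projected onto the tangent space
    direction = -grad                # first search direction: steepest descent
    for _ in range(max_iter):
        alpha = line_search(x, direction)        # step size along the search direction
        x_new = retract(x, alpha * direction)    # map the tangent step back onto the manifold
        grad_new = riemannian_grad(x_new)
        if np.linalg.norm(grad_new) < tol:       # stop once the gradient norm is small
            return x_new
        # Fletcher-Reeves conjugacy factor; the old direction lives in the tangent
        # space at x and must be transported to x_new before it can be reused.
        beta = np.vdot(grad_new, grad_new) / np.vdot(grad, grad)
        direction = -grad_new + beta * transport(x, x_new, direction)
        x, grad = x_new, grad_new
    return x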
Riemannian preconditioning for tensor completion
We propose a novel Riemannian preconditioning approach for the tensor completion problem with rank constraint. A Riemannian metric or inner product is proposed that exploits the least-squares structure of the cost function and takes into account the structured symmetry in the Tucker decomposition. The specific metric allows the use of the versatile framework of Riemannian optimization on quotient manif...
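For context, a Tucker decomposition writes a d-way tensor X as (in illustrative notation, not necessarily that of the cited work)

X = G \times_1 U_1 \times_2 U_2 \cdots \times_d U_d,

with a core tensor G and factor matrices U_i, where \times_i denotes the mode-i product. The structured symmetry mentioned above is the invariance of X under U_i \mapsto U_i O_i together with G \mapsto G \times_i O_i^{\top} for orthogonal matrices O_i; factoring out this symmetry is what leads to optimization on a quotient manifold.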
A Dual Framework for Low-rank Tensor Completion
One of the popular approaches to low-rank tensor completion is to use the latent trace norm as a low-rank regularizer. However, most of the existing works learn a sparse combination of tensors. In this work, we fill this gap by proposing a variant of the latent trace norm that helps to learn a non-sparse combination of tensors. We develop a dual framework for solving the problem of latent tra...
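For reference, the latent trace norm of a d-way tensor X is commonly defined in the literature (this is the standard definition, not necessarily the variant proposed in the work above) as

\| X \|_{\mathrm{latent}} \;=\; \inf_{X^{(1)} + \cdots + X^{(d)} = X} \; \sum_{k=1}^{d} \big\| X^{(k)}_{(k)} \big\|_{*},

where X^{(k)}_{(k)} is the mode-k unfolding (matricization) of the k-th component tensor and \|\cdot\|_* is the matrix nuclear norm. Minimizing this regularizer tends to favor a sparse combination of the components X^{(k)}, which is the behavior the work above seeks to move away from.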
Journal: SIAM J. Scientific Computing
Volume: 38, Issue: -
Pages: -
Publication date: 2016